Web Survey Bibliography
Relevance & Research Question: Survey length is an important factor that researchers have to consider when designing questionnaires. Longer interviews are assumed to impose a greater cognitive burden on respondents, which may have a negative impact on data quality. Over the course of long surveys, respondents may tire of answering questions and may be more likely to use satisficing response strategies to cope with the cognitive demands. Furthermore, longer surveys increase the costs of questionnaire programming and interviewing. Despite the impact of interview duration on data quality and costs, survey designers are often uncertain about the length of their survey and apply only rules of thumb, if any, to predict it. The research project presented in this article investigates how item properties and respondent characteristics influence item-level response times to web survey questions. The project builds on the response time analysis by Yan and Tourangeau (2008) and other studies of response times and interview duration, and examines whether their findings can be replicated using a different dataset. Finally, the development of a tool for response time prediction is discussed as a possible use of the results.
Methods & Data: The analysis is based on data from the GESIS Online Panel Pilot, a probability-based online panel of German-speaking, Internet-using adults living in Germany. The survey is well suited to studies of response times because it contains a large variety of question types on multiple topics. Response times to survey items were captured in the respondent’s web browser (client side) by implementing JavaScript code on each survey page. In contrast to response times collected at the web server, client-side response times are more precise measures of the response process because they do not include download times. Multilevel models are applied to account for the fact that item-level response times are cross-classified by survey questions and respondents. Assuming that the effect of respondent characteristics is constant over items and the effect of item properties is constant across respondents, a set of random-intercept, fixed-slope models is fitted. Starting from an unconditional model without any covariates, predictors at the respondent level and the item level, as well as cross-level interactions, are successively included as fixed effects to account for the observed variation in response times.
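A model of this kind, with crossed random intercepts for respondents and items and fixed slopes for covariates at both levels, can be sketched in Python using statsmodels, which supports crossed random effects through variance components attached to a single dummy group. The data below are simulated purely for illustration; the covariate names (`age65`, `open_ended`) and effect sizes are assumptions echoing the predictors discussed in the abstract, not values from the GESIS data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_resp, n_item = 30, 10

# Cross-classified design: every respondent answers every item.
df = pd.DataFrame(
    [(r, i) for r in range(n_resp) for i in range(n_item)],
    columns=["respondent", "item"],
)

resp_eff = rng.normal(0, 0.3, n_resp)  # person-level speed differences
item_eff = rng.normal(0, 0.5, n_item)  # item-level difficulty differences
df["age65"] = rng.integers(0, 2, n_resp)[df["respondent"]]   # respondent-level covariate
df["open_ended"] = rng.integers(0, 2, n_item)[df["item"]]    # item-level covariate

# Log response times: fixed effects plus crossed random intercepts plus noise.
df["log_rt"] = (
    2.0
    + 0.2 * df["age65"]        # simulated: older respondents answer more slowly
    + 0.4 * df["open_ended"]   # simulated: open-ended items take longer
    + resp_eff[df["respondent"]]
    + item_eff[df["item"]]
    + rng.normal(0, 0.2, len(df))
)

# Crossed random intercepts via variance components: one dummy group
# covering all rows, with one variance component per classification.
model = smf.mixedlm(
    "log_rt ~ age65 + open_ended",
    data=df,
    groups=np.ones(len(df)),
    vc_formula={"respondent": "0 + C(respondent)", "item": "0 + C(item)"},
)
result = model.fit()
print(result.summary())
```

Successively richer models, as described above, would add further respondent- and item-level predictors (and their cross-level interactions) to the fixed-effects formula while keeping the two variance components unchanged.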
Results: The analysis shows that the respondent’s age, education, Internet experience and motivation are significant predictors of response times. Respondents who are younger than 65, have A-levels or Vocational A-levels, use the Internet frequently and are less motivated to participate in the survey take less time to complete items in web surveys. Survey participants using smartphones or tablets to complete the survey have longer response times than participants who use desktop computers or laptops. Since the questionnaire of the GESIS Online Panel Pilot was not optimised for mobile devices, mobile respondents may have had problems reading questions or selecting the appropriate response options. Among item properties, the complexity of questions and the format of response options were found to affect response times: survey items with long question texts, many response options and open-ended questions are associated with longer response times than less complex questions with closed-ended response formats. Comparing response times across waves for a survey evaluation item asked at the end of each survey shows that respondents become faster over the course of the panel study. Several results were contrary to prior expectations: the response times of experienced respondents who completed at least one survey in the previous year do not differ significantly from those of inexperienced survey participants. The position of an item within the questionnaire does not significantly influence its response time, which implies that respondents do not speed up over the course of a single survey. Factual questions, not attitude questions, induce the longest response times among all question types. Moreover, sensitive items are completed faster than items not dealing with sensitive topics.
This finding may be explained by survey participants answering sensitive items less thoroughly or tending to skip these questions. None of the cross-level interaction effects were significant predictors of response times. Apart from substantive variables, a set of paradata variables describing the process of questionnaire navigation was included in the model to account for variation in response times. Respondents who scroll horizontally or vertically on a survey page need more time to complete the item on that page. Assuming that response times are an indicator of respondent burden, this finding implies that survey pages should be adapted to the respondent’s screen size to avoid scrolling. Response times are also affected if survey participants leave the survey window, for example to access another webpage in the browser, or revisit survey pages to edit their responses.
Added Value & Limitations: The present analysis replicated many findings from previous studies on response times using a probability-based online panel in which response times were collected at the client side. However, some results were not in line with previous research and need to be investigated in future studies. Beyond replication, the study contributes to existing research by examining response times in the context of a panel study and demonstrating that respondents speed up across survey waves. Furthermore, the analysis indicated that paradata describing the process of questionnaire navigation are significant predictors of response times and should be collected and controlled for in future response time analyses. The limitations of the present study are the very low response rate of the GESIS Online Panel Pilot and the small number of observations compared to previous studies on response times. Although the sample sizes are large enough to estimate multilevel models, the statistical power of the fitted models may be reduced. Moreover, the nesting of survey questions within waves, in addition to the cross-classification of survey items and respondents, has not been considered due to the small group size at the level of survey waves. To address these shortcomings, response time data from online panels with larger sample sizes and more survey waves should be analysed.
Web survey bibliography - Other (439)
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Comparing online and telephone survey results in the context of a skin cancer prevention campaign evaluation...; 2016; Hollier, L.P.; Pettigrew, S.; Slevin, T.; Strickland, M.; Minto, C.
- Sample Representation and Substantive Outcomes Using Web With and Without Incentives Compared to Telephone...; 2016; Lipps, O.; Pekari, N.
- The Dynamic Identity Fusion Index: A New Continuous Measure of Identity Fusion for Web-Based Questionnaires...; 2016; Jimenez, J.; Gomez, A.; Buhrmester, M.; Whitehouse, H.; Swann, W. B.
- Collecting Data from mHealth Users via SMS Surveys: A Case Study in Kenya; 2016; Johnson, D.
- When Should I Call You? An Analysis of Differences in Demographics and Responses According to Respondents...; 2016; Vicente, P.; Lopes, I.
- The use and positioning of clarification features in web surveys; 2016; Metzler, A.; Kunz, T.; Fuchs, M.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- A Technical Guide to Effective and Accessible web Surveys; 2016; Baatard, G.
- A Framework of Incorporating Thai Social Networking Data in Online Marketing Survey; 2016; Jiamthapthaksin, R.; Aung, T. H.; Ratanasawadwat, N.
- Internet Abusive Use Questionnaire: Psychometric properties; 2016; Calvo-Frances, F.
- The impact of academic sponsorship on Web survey dropout and item non-response; 2016; Allen, P. J.; Roberts, L. D.
- A Statistical Approach to Provide Individualized Privacy for Surveys; 2016; Esponda, F.; Huerta, K.; Guerrero, V. M.
- Quality of Different Scales in an Online Survey in Mexico and Colombia; 2016; Revilla, M.; Ochoa, C.
- Presentation matters: how mode effects in item non-response depend on the presentation of response options...; 2016; Zeglovits, E.; Schwarzer, S.
- Exploring Factors in Contributing Student Progress in the Open University; 2016; Arifin, M. H.
- Taking MARS Digital; 2015; Melton, E.; Krahn, J.
- A Comparison of the Effects of Face-to-Face and Online Deliberation on Young Students’ Attitudes...; 2015; Triantafillidou, A.; Yannas, P.; Lappas, G.; Kleftodimos, A.
- Doing Online Surveys: Zum Einsatz in der sozialwissenschaftlichen Raumforschung; 2015; Nadler, R.; Petzold, K.; Schoenduwe, R.
- Using Mobile Phones for High-Frequency Data Collection; 2015; Azevedo, J. P.; Ballivian, A.; Durbin, W.
- Willingness of Online Access Panel Members to Participate in Smartphone Application-Based Research; 2015; Pinter, R.
- Who Has Access to Mobile Devices in an Online Opt-in Panel? An Analysis of Potential Respondents for...; 2015; Revilla, M.; Toninelli, D.; Ochoa, C.; Loewe, G.
- Cell Phone and Face-to-face Interview Responses in Population-based Surveys - How Do They Compare?; 2015; Ghandour, L.; Ghandour, B.; Mahfoud, Z.; Mokdad, A.; Sibai, A. M.
- Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study; 2015; Arn, B.; Klug, S.; Kolodziejski, J.
- Web Surveys Optimized for Smartphones: Are there Differences Between Computer and Smartphone Users?; 2015; Andreadis, I.
- Validation of the new scale for measuring behaviors of Facebook users: Psycho-Social Aspects of Facebook...; 2015; Bodroza, B.; Jovanovic, T.
- Participation rates, response bias and response behaviours in the community survey of the Swiss Spinal...; 2015; Fekete, C.; Segerer, W.; Gemperli, A.; Brinkhof, M.W.G.
- Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities; 2015; Lee, C.-K.; Back, K.-J.; Williams, Ro. J.; Ahn, S.-S.
- Hidden Populations, Online Purposive Sampling, and External Validity: Taking off the Blindfold; 2015; Barrat, M. J.; Ferris, J. A.; Lenton, S.
- Impact of raising awareness of respondents on the measurement quality in a web survey; 2015; Revilla, M.
- Open narrative questions in PC and smartphones: is the device playing a role?; 2015; Revilla, M.; Ochoa, C.
- Can we augment web responses with telephonic responses to a graduate destination survey?; 2015; du Toit, J.
- Comparison of Internet and interview survey modes when estimating willingness to pay using choice experiments...; 2015; Mjelde, J. W.; Kim, T. K.; Lee, C.-K.
- Suggestions for international research using electronic surveys; 2015; e Silva, S. C.; Duarte, P.
- Effects of Forced Responses and Question Display Styles on Web Survey Response Rates; 2015; Tangmanee, C.; Niruttinanon, P.
- Using Internet to Recruit Immigrants with Language and Culture Barriers for Tobacco and Alcohol Use...; 2015; Carlini, B. H.; Safioti, L.; Rue, T. C.; Miles, L.
- Recruiting Online: Lessons From a Longitudinal Survey of Contraception and Pregnancy Intentions of Young...; 2015; Harris, M. L.; Loxton, D.; Wigginton, B.; Lucke, J. C.
- Recruiting for addiction research via Facebook; 2015; Thornton, L. K.; Harris, K.; Baker, A.; Johnson, M.; Kay-Lambkin, F. J.
- Can a non-probabilistic online panel achieve question quality similar to that of the European Social...; 2015; Revilla, M.; Saris, W. E.; Loewe, G.; Ochoa, C.
- The quality of responses to grid questions as used in Web questionnaires (compared with paper questionnaires...; 2015; Dominguez, J. A.; de Rada, V. D.
- What are the Links in a Web Survey Among Response Time, Quality, and Auto-Evaluation of the Efforts...; 2015; Revilla, M.; Ochoa, C.
- Impact of mixed modes on measurement errors and estimates of change in panel data; 2015; Cernat, A.
- Probabilistic Web Survey Methodology in Education Centers: An Example in Spanish Schools; 2015; Tapia, J. A.; Menendez, J. A.
- Offline recruiting of young people for an online survey - what affects response rates; 2015; Zeglovits, E.
- Smartphones @work; 2015; Bittman, M.
- Comparison of different mixed-mode and face-to-face surveys - response rates and costs; 2015; Ainsaar, M.; Hendrikson, R.
- Online Eye-Tracking of Dynamic Advertising Content in (Mobile) Web-Surveys; 2015; Berger, S.
- Predicting Response Times in Web Surveys; 2015; Wenz, A.